Sequential Importance Sampling for Nonparametric Bayes Models: The Next Generation

Authors

  • Steven N. MacEachern
  • Merlise Clyde
  • Jun S. Liu
Abstract

There are two generations of Gibbs sampling methods for semi-parametric models involving the Dirichlet process. The first generation suffered from a severe drawback; namely, that the locations of the clusters, or groups of parameters, could essentially become fixed, moving only rarely. Two strategies that have been proposed to create the second generation of Gibbs samplers are integration and appending a second stage to the Gibbs sampler wherein the cluster locations are moved. We show that these same strategies are easily implemented for the sequential importance sampler, and that the first strategy dramatically improves results. As in the case of Gibbs sampling, these strategies are applicable to a much wider class of models. They are shown to provide more uniform importance sampling weights and lead to additional Rao-Blackwellization of estimators.
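The "integration" strategy described in the abstract can be illustrated with a minimal sketch of a sequential importance sampler for a Dirichlet process mixture of normals. This is not the paper's implementation: the conjugate normal model, the hyperparameter names (`alpha`, `sigma2`, `tau2`), and all function names below are illustrative assumptions. Because cluster locations are integrated out, each particle carries only cluster assignments and sufficient statistics, and the weight update multiplies in the predictive density of each new observation, which is the Rao-Blackwellized update the abstract alludes to.

```python
import math
import random

def sis_dpm(y, alpha=1.0, sigma2=1.0, tau2=1.0, n_particles=200, seed=0):
    """Collapsed sequential importance sampling for a DP mixture of
    N(mean, sigma2) observations with a N(0, tau2) prior on cluster means.
    Cluster locations are marginalized out (the 'integration' strategy)."""
    rng = random.Random(seed)
    # Each particle stores cluster assignments plus per-cluster counts/sums.
    particles = [{"assign": [], "n": [], "sum": []} for _ in range(n_particles)]
    logw = [0.0] * n_particles  # log importance weights

    def log_pred(yi, n_k, sum_k):
        # Predictive density of yi given a cluster's sufficient statistics,
        # with the cluster mean integrated out under its conjugate posterior.
        post_var = 1.0 / (1.0 / tau2 + n_k / sigma2)
        post_mean = post_var * (sum_k / sigma2)
        v = sigma2 + post_var
        return -0.5 * math.log(2 * math.pi * v) - 0.5 * (yi - post_mean) ** 2 / v

    for i, yi in enumerate(y):
        for p in range(n_particles):
            st = particles[p]
            # CRP prior weight times integrated likelihood for each existing
            # cluster, plus a new-cluster option.
            logs = [math.log(st["n"][k] / (i + alpha)) +
                    log_pred(yi, st["n"][k], st["sum"][k])
                    for k in range(len(st["n"]))]
            logs.append(math.log(alpha / (i + alpha)) + log_pred(yi, 0, 0.0))
            m = max(logs)
            probs = [math.exp(l - m) for l in logs]
            tot = sum(probs)
            # The normalizing constant is the predictive density of yi;
            # folding it into the weight is the Rao-Blackwellized update.
            logw[p] += m + math.log(tot)
            # Sample the cluster assignment from the collapsed conditional.
            u = rng.random() * tot
            k, acc = 0, probs[0]
            while u > acc:
                k += 1
                acc += probs[k]
            if k == len(st["n"]):          # open a new cluster
                st["n"].append(0)
                st["sum"].append(0.0)
            st["assign"].append(k)
            st["n"][k] += 1
            st["sum"][k] += yi
    return particles, logw
```

Because the proposal here is the exact collapsed conditional for each assignment, the weights depend on the data only through predictive densities, which tends to keep them far more uniform than proposals that condition on sampled cluster locations.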


Similar articles

Sequential Importance Sampling for Nonparametric Bayes Models: The Next Generation

There are two generations of Gibbs sampling methods for semiparametric models involving the Dirichlet process. The first generation suffered from a severe drawback: the locations of the clusters, or groups of parameters, could essentially become fixed, moving only rarely. Two strategies that have been proposed to create the second generation of Gibbs samplers are integration and appending a sec...

Full text

Fitting general stochastic volatility models using Laplace accelerated sequential importance sampling

Simulated maximum likelihood has proved to be a valuable tool for fitting the log-normal stochastic volatility model to financial returns time series. In this paper, we develop a methodology that generalizes these methods to more general stochastic volatility models that are naturally cast in terms of a positive volatility process. The methodology relies on combining two well known methods for ...

Full text

Resampling: An improvement of importance sampling in varying population size models.

Sequential importance sampling algorithms have been defined to estimate likelihoods in models of ancestral population processes. However, these algorithms are based on features of the models with constant population size, and become inefficient when the population size varies in time, making likelihood-based inferences difficult in many demographic situations. In this work, we modify a previous...

Full text

A Sequential Monte Carlo Approach to Computing Tail Probabilities in Stochastic Models

Sequential Monte Carlo methods which involve sequential importance sampling and resampling are shown to provide a versatile approach to computing probabilities of rare events. By making use of martingale representations of the sequential Monte Carlo estimators, we show how resampling weights can be chosen to yield logarithmically efficient Monte Carlo estimates of large deviation probabilities ...

Full text

Sequential Monte Carlo samplers for Bayesian DSGE models

Bayesian estimation of DSGE models typically uses Markov chain Monte Carlo as importance sampling (IS) algorithms have a difficult time in high-dimensional spaces. I develop improved IS algorithms for DSGE models using recent advances in Monte Carlo methods known as sequential Monte Carlo samplers. Sequential Monte Carlo samplers are a generalization of particle filtering designed for full simu...

Full text



Journal:

Volume   Issue 

Pages  -

Publication date: 1998